Low Rank Matrix Recovery
Global Optimality of Local Search for Low Rank Matrix Recovery
We show that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements. With noisy measurements we show all local minima are very close to a global optimum. Together with a curvature bound at saddle points, this yields a polynomial time global convergence guarantee for stochastic gradient descent from random initialization.
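The factorized approach the abstract refers to can be sketched in a few lines: run gradient descent on f(U) = (1/2m) Σᵢ (⟨Aᵢ, UUᵀ⟩ − yᵢ)² from a random start. The sketch below is illustrative, not the paper's exact setup: the problem sizes, step size, and iteration count are hypothetical choices, and the measurement matrices are plain Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 10, 2, 200              # dimension, rank, measurements (hypothetical sizes)

# Ground-truth rank-r PSD matrix X* = U* U*^T
U_star = rng.standard_normal((n, r))
X_star = U_star @ U_star.T

# Gaussian linear measurements y_i = <A_i, X*>
A = rng.standard_normal((m, n, n))
y = np.einsum('kij,ij->k', A, X_star)

def grad(U):
    """Gradient of f(U) = (1/2m) * sum_i (<A_i, U U^T> - y_i)^2."""
    res = np.einsum('kij,ij->k', A, U @ U.T) - y   # residuals, shape (m,)
    G = np.einsum('k,kij->ij', res, A) / m         # (1/m) sum_i res_i * A_i
    return (G + G.T) @ U                           # chain rule through U U^T

# Plain gradient descent on the factorized objective, from random initialization
U = rng.standard_normal((n, r))
step = 0.005                                       # hypothetical step size
for _ in range(5000):
    U = U - step * grad(U)

rel_err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
print(rel_err)
```

With enough Gaussian measurements (here m ≫ nr), the random-initialization run recovers X* to high accuracy despite the non-convexity, which is the phenomenon the paper analyzes.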
Reviews: Global Optimality of Local Search for Low Rank Matrix Recovery
This is a nice result. I am going to list a few nits I had about the paper as I read along; I think addressing some of these points would improve the presentation of the paper. 1. There are a few cases which are not covered by the results, for instance the strict-saddle property in the noisy case, and whether local minima are close to a global optimum in the high-rank, noisy case. A discussion of why these cases are not covered would be nice; I am assuming that it is not just a straightforward modification of the current proof? 2. In practice, I believe that randomly initialized gradient descent without noise is sufficient.
Bhojanapalli, Srinadh; Neyshabur, Behnam; Srebro, Nati
Papers published at the Neural Information Processing Systems Conference.
Online Robust Low Rank Matrix Recovery
Guo, Xiaojie (Chinese Academy of Sciences)
Low rank matrix recovery has shown its importance as a theoretical foundation in many areas of information processing. Its solutions are usually obtained in batch mode, which requires loading all the data into memory during processing and is thus hardly applicable to large-scale data. Moreover, a fraction of the data may be severely contaminated by outliers, which makes accurate recovery significantly more challenging. This paper proposes a novel online robust low rank matrix recovery method to address these difficulties. In particular, we first introduce an online algorithm to solve the problem of low rank matrix completion. We then move on to low rank matrix recovery from observations with intensive outliers, where the outlier support is robustly estimated from a mixture-model perspective. Experiments on both synthetic and real data are conducted to demonstrate the efficacy of our method and show its superior performance over state-of-the-art methods.
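The online completion step the abstract builds on can be illustrated with streaming stochastic gradient updates on a factorized model: each observed entry triggers an update of one row of each factor, so memory stays O((n₁+n₂)r) rather than O(n₁n₂). The sketch below is a generic online matrix-factorization update with hypothetical sizes and step size, not the paper's robust method, which additionally estimates the outlier support via a mixture model.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2, r = 30, 30, 2             # matrix shape and rank (hypothetical sizes)

# Ground-truth low-rank matrix M = U* V*^T
M = rng.standard_normal((n1, r)) @ rng.standard_normal((n2, r)).T

# Factor matrices maintained online; the full n1 x n2 matrix is never stored
U = 0.5 * rng.standard_normal((n1, r))
V = 0.5 * rng.standard_normal((n2, r))
step = 0.05                        # hypothetical step size

# Stream one observed entry (i, j, M[i, j]) at a time and take one SGD step
for _ in range(100_000):
    i, j = rng.integers(n1), rng.integers(n2)
    err = U[i] @ V[j] - M[i, j]    # prediction error on this entry
    # Simultaneous update of row i of U and row j of V (RHS evaluated first)
    U[i], V[j] = U[i] - step * err * V[j], V[j] - step * err * U[i]

rel = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(rel)
```

Robust variants replace the squared entrywise error with a loss or weighting that down-weights entries flagged as outliers, which is where the paper's mixture-model estimate of the outlier support would plug in.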
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.68)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.46)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (0.32)